Gran Turismo


Out-of-Distribution Generalization with a SPARC: Racing 100 Unseen Vehicles with a Single Policy

Grooten, Bram, MacAlpine, Patrick, Subramanian, Kaushik, Stone, Peter, Wurman, Peter R.

arXiv.org Artificial Intelligence

Generalization to unseen environments is a significant challenge in the field of robotics and control. In this work, we focus on contextual reinforcement learning, where agents act within environments with varying contexts, such as self-driving cars or quadrupedal robots that need to operate in different terrains or weather conditions than they were trained for. We tackle the critical task of generalizing to out-of-distribution (OOD) settings, without access to explicit context information at test time. Recent work has addressed this problem by training a context encoder and a history adaptation module in separate stages. While promising, this two-phase approach is cumbersome to implement and train. We simplify the methodology and introduce SPARC: single-phase adaptation for robust control. We test SPARC on varying contexts within the high-fidelity racing simulator Gran Turismo 7 and wind-perturbed MuJoCo environments, and find that it achieves reliable and robust OOD generalization.
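The single-phase idea the abstract describes can be sketched in miniature. Everything below is illustrative and not from the paper: the dummy environment, the linear "history adaptation" policy, and the finite-difference update are stand-ins for the deep networks and RL algorithm SPARC actually uses. The point is only structural: the parameters that infer the hidden context from recent observations are trained jointly with the policy, in one phase, directly from environment reward.

```python
# Illustrative sketch (names and numbers are hypothetical, not from the paper):
# single-phase training of a history-based adaptation policy from reward alone.
import random

def rollout(policy, context):
    """Dummy environment interaction: returns a (history, reward) pair.
    Observations depend on a hidden context the agent never sees directly;
    the best action matches that hidden context."""
    history = [round(context * s, 3) for s in (0.1, 0.2, 0.3)]
    reward = -abs(policy(history) - context)
    return history, reward

def make_policy(weight):
    # A linear "history adaptation" policy: infers the context from the
    # recent observation history via a single learned weight.
    return lambda history: weight * sum(history)

def train_single_phase(steps=200, lr=0.2, seed=0):
    """Single phase: the history-encoding weight is updated directly from
    environment reward (here by a crude finite-difference gradient estimate),
    with no separate context-encoder pre-training stage."""
    random.seed(seed)
    w = 0.0
    for _ in range(steps):
        ctx = random.uniform(0.5, 1.5)  # unobserved context, varies per episode
        eps = 1e-3
        _, r_plus = rollout(make_policy(w + eps), ctx)
        _, r_minus = rollout(make_policy(w - eps), ctx)
        w += lr * (r_plus - r_minus) / (2 * eps)  # ascend the reward
    return w

w = train_single_phase()
policy = make_policy(w)
_, reward = rollout(policy, context=1.0)
print(round(w, 3), round(reward, 3))
```

In the two-phase recipe the abstract contrasts with, the context encoder would be trained first (with privileged access to the true context) and the history module fitted to imitate it afterwards; here both roles collapse into the one weight updated end-to-end.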


I played Horizon Zero Dawn inside Sony and Honda's Afeela concept EV at CES 2024

Engadget

A year after Sony Honda Mobility (SHM) announced its debut EV concept, the Afeela, the company is back at CES 2024 to offer more details, more collaborations and a driving simulator. The name of the concept vehicle hasn't changed since last we saw it. What is new, however, is the car's ability to be driven around with a PlayStation controller. I didn't get to do that -- it was a stunt operated by one of the company's employees -- but there was a DualSense controller involved in my demo. So let's begin where SHM left off.


AiThority Interview with Pete Wurman, Director at Sony AI America

#artificialintelligence

I got my undergraduate degree from MIT in mechanical engineering. I didn't feel ready to be an engineer, so I went to the University of Michigan to get a master's degree in mechanical engineering. Along the way, I got a job programming at the university, and eventually decided to go back to school and get a Ph.D. in computer science. From there, I became a professor in the Computer Science Department at North Carolina State. In 2004, as I went up for tenure, my roommate from my undergraduate days at MIT came up with an idea for a robotic warehouse system, and convinced me to help him start what became Kiva Systems.


Sony's racing AI destroyed its human competitors by being nice (and fast)

MIT Technology Review

But Sony soon learned that speed alone wasn't enough to make GT Sophy a winner. The program outpaced all human drivers on an empty track, setting superhuman lap times across three different virtual courses. Yet when Sony tested GT Sophy in a race against multiple human drivers, where intelligence as well as speed is needed, GT Sophy lost. The program was at times too aggressive, racking up penalties for reckless driving, and at other times too timid, giving way when it didn't need to. Sony regrouped, retrained its AI, and set up a rematch in October.


AI Outraces Human Champs at the Video Game Gran Turismo

#artificialintelligence

To hurtle around a corner along the fastest "racing line" without losing control, race car drivers must brake, steer and accelerate in precisely timed sequences. The process depends on the limits of friction, which are governed by known physical laws--which means self-driving cars can learn to complete a lap at the fastest possible speed (as some have already done). But this becomes a much knottier problem when the automated driver has to share space with other cars. Now scientists have unraveled the challenge virtually by training an artificial intelligence program to outpace human competitors at the ultrarealistic racing game Gran Turismo Sport. The findings could point self-driving car researchers toward new ways to make this technology function in the real world.


Sony's new AI driver achieves "reliably superhuman" race times in Gran Turismo

#artificialintelligence

AI agents have bested humans at many games, from chess to Go to poker. Now, the machines can claim a new high score on the classic racing video game series Gran Turismo. Sony announced today that its researchers have developed an AI driver named GT Sophy that is "reliably superhuman" -- able to beat top human drivers in Gran Turismo Sport in back-to-back laps. You might think this an easy challenge. After all, isn't racing simply a matter of speed and reaction time and therefore simple for a machine to master?


Sony's AI can employ clever strategies to beat the best humans at 'Gran Turismo' - SiliconANGLE

#artificialintelligence

Sony Corp. announced today that it has created an artificial intelligence that can get the better of humans when playing the simulation game "Gran Turismo," which could have implications in the future for self-driving technology. AI has already mastered such games as "Go" and chess, and according to Sony, it took just two days of training for the technology, Gran Turismo Sophy (GT Sophy), to leave human players in the dust. At the two-day mark it was beating 95% of the best human players, and in the following days kept shaving time from its previous results. Sony said the AI mastered a number of tracks, not only by figuring out when to slow down or accelerate but also by using tactics such as judging when the time is right to get in behind a car and use the slipstream. When it became obvious that the car wasn't on a good racing line, it would switch to another line.


Sony's new AI beats humans in Gran Turismo racing game

The Japan Times

Sony Group Corp. said on Wednesday it had created an artificial intelligence agent called Gran Turismo Sophy (GT Sophy) that was able to beat the world's best drivers of the PlayStation racing simulation game Gran Turismo. To get GT Sophy ready for the game, different units of Sony brought in fundamental AI research, a hyper-realistic real-world racing simulator, and infrastructure for massive-scale AI training, the company said in a statement. The AI first raced against four top Gran Turismo drivers in July, learned from the race and outperformed the human drivers in another race in October. "It took about 20 PlayStations running simultaneously for about 10 to 12 days to train GT Sophy to race from scratch to superhuman level," said Peter Wurman, director of Sony AI America and the leader of the team that designed the AI. While AI had been used to defeat humans in the games of chess, mahjong and go, Sony said the difficulty in mastering race car driving was the many decisions that need to be made in real time.


Sony's Sophy racing AI beats Gran Turismo's top talent

Engadget

Hyper-capable AIs have been beating us at our own games for years. Whether it's Go or Jeopardy, DOTA 2 or Nethack, artificial intelligences have routinely proven themselves superior competitors, helping advance not only the state of the gaming arts but also machine learning and computational science. On Wednesday, Sony announced its latest addition to the field, GT Sophy, an AI racer capable of taking on -- and beating -- some of the world's best Gran Turismo players. GT Sophy (the GT stands for "Gran Turismo") is the result of a collaboration between Sony AI, Polyphony Digital (PDI) and Sony Interactive Entertainment (SIE), as well as more than half a decade of research and development. "Gran Turismo Sophy is a significant development in AI whose purpose is not simply to be better than human players, but to offer players a stimulating opponent that can accelerate and elevate the players' techniques and creativity to the next level," Sony AI CEO, Hiroaki Kitano, said in a statement Wednesday.


AI driver can beat some of the world's best players at Gran Turismo

New Scientist

An artificial intelligence has beaten four of the world's best human drivers on three different tracks in the racing video game Gran Turismo Sport, by gaining ground at the most difficult parts of a track. The AI, named GT Sophy, was able to execute tactical moves such as using an opponent's slipstream to boost itself forwards and blocking its opponents from passing. Peter Wurman at Sony AI in New York and his colleagues trained the system using deep reinforcement learning, a type of machine learning that uses rewards and penalties to teach the AI's neural network how to win. During training, GT Sophy, which was running on a separate computer, played the game on up to 20 PlayStation 4 consoles simultaneously. The team gave the AI the ability to accelerate, brake and steer, along with real-time information on the position of the cars in the game, including its own, and a map of the next 6 seconds of the track, meaning it could see farther down the track when travelling faster.
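The rewards-and-penalties mechanism described above can be shown with a toy example. This is a minimal tabular Q-learning sketch on a made-up two-state braking task; GT Sophy itself uses deep neural networks, continuous control, and a far richer observation, so the task, states, and reward values here are illustrative only.

```python
# Toy sketch of learning from rewards and penalties (tabular Q-learning on a
# hypothetical corner-braking task, NOT GT Sophy's actual algorithm or state).
import random

ACTIONS = ["accelerate", "brake", "steer"]  # the control outputs named in the article

def step(state, action):
    """Two alternating states: 0 = straight, 1 = corner.
    Braking at the corner is rewarded; accelerating through it is penalized."""
    if state == 1:
        reward = 1.0 if action == "brake" else -1.0
    else:
        reward = 1.0 if action == "accelerate" else -0.1
    return (state + 1) % 2, reward

def train(steps=500, alpha=0.1, gamma=0.9, epsilon=0.1, seed=0):
    random.seed(seed)
    q = {(s, a): 0.0 for s in (0, 1) for a in ACTIONS}
    state = 0
    for _ in range(steps):
        if random.random() < epsilon:          # occasional exploration
            action = random.choice(ACTIONS)
        else:                                  # otherwise act greedily
            action = max(ACTIONS, key=lambda a: q[(state, a)])
        nxt, reward = step(state, action)
        best_next = max(q[(nxt, a)] for a in ACTIONS)
        # The core update: nudge the value estimate toward reward + discounted
        # future value, so rewarded actions are chosen more often.
        q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
        state = nxt
    return q

q = train()
print(max(ACTIONS, key=lambda a: q[(0, a)]),   # learned choice on the straight
      max(ACTIONS, key=lambda a: q[(1, a)]))   # learned choice at the corner
```

After training, the greedy policy accelerates on the straight and brakes at the corner, purely because those choices accumulated reward rather than penalty, which is the same feedback loop, at vastly larger scale, that shaped GT Sophy's driving.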